A Novel Training Protocol for Performance Predictors of Evolutionary Neural Architecture Search Algorithms

Authors

Abstract

Evolutionary neural architecture search (ENAS) can automatically design the architectures of deep neural networks (DNNs) using evolutionary computation algorithms. However, most ENAS algorithms require intensive computational resources, which are not necessarily available to every interested user. Performance predictors are a type of regression model that can assist in accomplishing the search without consuming much computational resource. Although various performance predictors have been designed, they all employ the same training protocol to build the models: 1) sampling a set of DNNs with their performance as the training dataset; 2) training the model with the mean square error criterion; and 3) predicting the performance of DNNs newly generated during the ENAS run. In this article, we point out through intuitive and illustrative examples that the three steps constituting this training protocol are not well thought out. Furthermore, we propose a new training protocol to address these issues, which consists of designing a pairwise ranking indicator to construct the training target, proposing the use of logistic regression to fit the training samples, and developing a differential method to build the training instances. To verify the effectiveness of the proposed protocol, four regression models widely used in the field of machine learning are chosen to perform comparisons on two benchmark datasets. The experimental results all demonstrate that the proposed protocol can significantly improve the prediction accuracy against the traditional training protocols.
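As a rough illustration of the protocol described in the abstract, the sketch below builds pairwise training instances from the differences of two architecture encodings, labels each pair by which architecture performs better (a pairwise ranking indicator), and fits a logistic-regression classifier on those instances. All names, encodings, and data here are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (assumed names/data, not the authors' code): pairwise
# ranking instances + logistic regression as a performance predictor.
import numpy as np
from sklearn.linear_model import LogisticRegression

def build_pairwise_instances(encodings, accuracies):
    """Turn (encoding, accuracy) samples into difference features labeled by
    which architecture of each pair performs better."""
    X, y = [], []
    n = len(encodings)
    for i in range(n):
        for j in range(n):
            if i != j:
                X.append(encodings[i] - encodings[j])         # differential instance
                y.append(int(accuracies[i] > accuracies[j]))  # pairwise ranking target
    return np.array(X), np.array(y)

# Toy data: 20 architectures encoded as 8-dimensional vectors with known accuracies.
rng = np.random.default_rng(0)
encodings = rng.random((20, 8))
accuracies = rng.random(20)

X, y = build_pairwise_instances(encodings, accuracies)
predictor = LogisticRegression(max_iter=1000).fit(X, y)

# During the evolutionary search, two newly generated architectures can then be
# compared directly without training either of them:
a, b = rng.random(8), rng.random(8)
p_a_better = predictor.predict_proba((a - b).reshape(1, -1))[0, 1]
print(f"Estimated probability that architecture a outperforms b: {p_a_better:.2f}")
```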


Similar Resources

Comparing Evolutionary Strategy Algorithms for Training Spiking Neural Networks

Spiking Neural Networks are considered the third generation of Artificial Neural Networks; these networks naturally process spatio-temporal information. Spiking Neural Networks have been used in several fields and application areas, pattern recognition among them. For dealing with supervised pattern recognition tasks, a gradient-descent-based learning rule (Spike-prop) has been developed...


Evolutionary Algorithms for Integer Weight Neural Network Training

In this work, differential evolution strategies are applied to the training of neural networks with integer weights. These strategies were introduced by Storn and Price [Journal of Global Optimization, 11, pp. 341–359, 1997]. Integer-weight neural networks are better suited for hardware implementation than their real-weight analogues. Our intention is to give a broad picture of the beh...
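The snippet below is a minimal, hypothetical illustration of the idea in this blurb: a basic DE/rand/1/bin loop that searches integer weights for a tiny linear model on toy data. The population size, F, CR, and the task itself are assumptions, not the paper's setup.

```python
# Hypothetical DE/rand/1/bin sketch for integer-weight training on a toy task
# (parameter values and the task are assumptions, not the paper's setup).
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: 3 inputs -> 1 output whose true weights are integers.
X = rng.normal(size=(64, 3))
y = X @ np.array([2, -3, 1])

def fitness(w):
    """Mean squared error of an integer-weight linear model."""
    return float(np.mean((X @ w - y) ** 2))

pop_size, dim, F, CR, generations = 20, 3, 0.8, 0.9, 100
pop = rng.integers(-8, 9, size=(pop_size, dim)).astype(float)

for _ in range(generations):
    for i in range(pop_size):
        candidates = [k for k in range(pop_size) if k != i]
        r1, r2, r3 = rng.choice(candidates, size=3, replace=False)
        mutant = pop[r1] + F * (pop[r2] - pop[r3])               # differential mutation
        trial = np.where(rng.random(dim) < CR, mutant, pop[i])   # binomial crossover
        trial = np.rint(trial)                                   # keep the weights integer
        if fitness(trial) <= fitness(pop[i]):                    # greedy selection
            pop[i] = trial

best = min(pop, key=fitness)
print("Best integer weights found:", best.astype(int))
```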


Accelerating Neural Architecture Search using Performance Prediction

Methods for neural network hyperparameter optimization and meta-modeling are computationally expensive due to the need to train a large number of model configurations. In this paper, we show that standard frequentist regression models can predict the final performance of partially trained model configurations using features based on network architectures, hyperparameters, and time-series valida...
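As a hedged sketch of the approach summarized above, the code below fits a standard regression model on assumed features of partially trained configurations (architecture/hyperparameter descriptors plus the first few points of the validation curve) to predict final performance; the feature names and data are invented for illustration.

```python
# Hedged sketch (assumed features and data): regression on partially trained
# configurations to predict final performance.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n_configs = 200
arch_hparam_features = rng.random((n_configs, 10))  # e.g., depth, width, learning rate, ...
early_val_curve = rng.random((n_configs, 5))        # validation accuracy over the first 5 epochs
final_accuracy = 0.8 * early_val_curve[:, -1] + 0.2 * rng.random(n_configs)  # toy target

features = np.hstack([arch_hparam_features, early_val_curve])
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(features[:150], final_accuracy[:150])
print("Held-out R^2:", model.score(features[150:], final_accuracy[150:]))
```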


A novel local search method for microaggregation

In this paper, we propose an effective microaggregation algorithm to produce more useful protected data for publishing. Microaggregation is mapped to a clustering problem with known minimum and maximum group-size constraints. In this scheme, the goal is to cluster n records into groups of at least k and at most 2k−1 records, such that the sum of the within-group squ...
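For illustration only, the following sketch states the objective implied by this formulation: given a partition of n records into groups whose sizes lie between k and 2k−1, it computes the within-group sum of squared errors that microaggregation seeks to minimize. The helper name and data are assumptions.

```python
# Illustrative only: the within-group SSE objective for a partition satisfying
# the k <= |group| <= 2k-1 size constraint (names and data are assumptions).
import numpy as np

def within_group_sse(records, groups, k):
    """Sum of squared distances of records to their group centroids."""
    total = 0.0
    for idx in groups:
        assert k <= len(idx) <= 2 * k - 1, "group size constraint violated"
        members = records[idx]
        total += ((members - members.mean(axis=0)) ** 2).sum()
    return total

rng = np.random.default_rng(0)
records = rng.random((9, 2))                  # nine 2-D records to be protected
groups = [[0, 1, 2], [3, 4, 5], [6, 7, 8]]    # a partition with k = 3 (sizes in [3, 5])
print("Within-group SSE:", within_group_sse(records, groups, k=3))
```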


Novel Hybrid Fuzzy-Evolutionary Algorithms for Optimization of a Fuzzy Expert System Applied to Dust Phenomenon Forecasting Problem

Nowadays, the dust phenomenon is one of the important challenges in warm and dry areas. Forecasting the phenomenon before its occurrence helps to take precautionary steps to prevent its consequences. The capabilities of fuzzy expert systems have been taken into account to help cope with the uncertainty associated with complex environments such as the dust forecasting problem. This paper presents novel hyb...



Journal

Journal Title: IEEE Transactions on Evolutionary Computation

Year: 2021

ISSN: 1941-0026, 1089-778X

DOI: https://doi.org/10.1109/tevc.2021.3055076